Image sensor

An image sensor or imager is a sensor that detects and conveys information used to make an image. It does so by converting the variable attenuation of light waves (as they pass through or reflect off objects) into signals, small bursts of current that convey the information. The waves can be light or other electromagnetic radiation. Image sensors are used in electronic imaging devices of both analog and digital types, which include digital cameras, camera modules, camera phones, optical mouse devices, medical imaging equipment, night vision equipment such as thermal imaging devices, radar, sonar, and others. As technology changes, electronic and digital imaging tends to replace chemical and analog imaging.

The two main types of electronic image sensors are the charge-coupled device (CCD) and the active-pixel sensor (CMOS sensor). Both CCD and CMOS sensors are based on metal–oxide–semiconductor (MOS) technology, with CCDs based on MOS capacitors and CMOS sensors based on MOSFET (MOS field-effect transistor) amplifiers. Analog sensors for invisible radiation tend to involve vacuum tubes of various kinds, while digital sensors include flat-panel detectors.


CCD vs. CMOS sensors

The two main types of digital image sensors are the charge-coupled device (CCD) and the active-pixel sensor (CMOS sensor), fabricated in complementary MOS (CMOS) or N-type MOS (NMOS or Live MOS) technologies. Both CCD and CMOS sensors are based on MOS technology, with MOS capacitors being the building blocks of a CCD and MOSFET amplifiers being the building blocks of a CMOS sensor. Cameras integrated in small consumer products generally use CMOS sensors, which are usually cheaper and have lower power consumption in battery-powered devices than CCDs. CCD sensors are used for high-end broadcast-quality video cameras, while CMOS sensors dominate in still photography and consumer goods where overall cost is a major concern. Both types of sensor accomplish the same task of capturing light and converting it into electrical signals.

Each cell of a CCD image sensor is an analog device. When light strikes the chip, it is held as a small electrical charge in each photo sensor. The charges in the line of pixels nearest to the (one or more) output amplifiers are amplified and output, then each line of pixels shifts its charges one line closer to the amplifiers, filling the empty line closest to the amplifiers. This process is repeated until all the lines of pixels have had their charge amplified and output.

A CMOS image sensor has an amplifier for each pixel compared to the few amplifiers of a CCD. This results in less area for the capture of photons than a CCD, but this problem has been overcome by using microlenses in front of each photodiode, which focus light into the photodiode that would otherwise have hit the amplifier and gone undetected. Some CMOS imaging sensors also use back-side illumination to increase the number of photons that hit the photodiode. CMOS sensors can potentially be implemented with fewer components, use less power, and/or provide faster readout than CCD sensors. They are also less vulnerable to static electricity discharges.

Another design, a hybrid CCD/CMOS architecture (sold under the name "sCMOS"), consists of CMOS readout integrated circuits (ROICs) that are bump bonded to a CCD imaging substrate – a technology that was developed for infrared staring arrays and has been adapted to silicon-based detector technology (scmos.com, home page). Another approach is to utilize the very fine dimensions available in modern CMOS technology to implement a CCD-like structure entirely in CMOS technology: such structures can be achieved by separating individual polysilicon gates by a very small gap. Though still a product of research, hybrid sensors can potentially harness the benefits of both CCD and CMOS imagers (Padmakumar R. Rao et al., "CCD structures implemented in standard 0.18 µm CMOS technology", ieee.org).
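
The line-by-line charge transfer described above can be illustrated with a short simulation. The following is a minimal sketch in Python, assuming a single output register and a made-up amplifier gain; it models the sensor simply as a 2-D array of accumulated photoelectron counts and is not any manufacturer's actual readout procedure.

```python
import numpy as np

def ccd_readout(charges, gain=2.0):
    """Simplified model of CCD readout: rows of accumulated charge are
    shifted one line at a time toward the output amplifier, where each
    line is amplified and read out in turn.

    charges : 2-D array of photoelectron counts (rows x columns)
    gain    : hypothetical output-amplifier gain (electrons -> counts)
    """
    rows, cols = charges.shape
    frame = charges.astype(float)
    output = np.zeros_like(frame)

    for line in range(rows):
        # The line nearest the output register is amplified and output...
        output[line] = frame[-1] * gain
        # ...then every remaining line shifts one row closer to the register,
        # leaving an empty line at the far end of the array.
        frame[1:] = frame[:-1]
        frame[0] = 0.0

    # Rows were read out bottom-first, so restore the original orientation.
    return output[::-1]

# Example: a tiny 4x4 "exposure" with a bright diagonal.
exposure = np.eye(4) * 1000
print(ccd_readout(exposure))
```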


Performance

There are many parameters that can be used to evaluate the performance of an image sensor, including dynamic range, signal-to-noise ratio, and low-light sensitivity. For sensors of comparable types, the signal-to-noise ratio and dynamic range improve as the sensor size increases.
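
As a rough illustration of two of these figures of merit, the sketch below computes dynamic range and shot-noise-limited signal-to-noise ratio from hypothetical sensor parameters, under the common simplifying assumptions that dynamic range is the ratio of full-well capacity to read noise and that photon shot noise dominates at high signal levels.

```python
import math

def dynamic_range_db(full_well_e, read_noise_e):
    # Dynamic range: ratio of the largest storable signal (full-well
    # capacity, in electrons) to the noise floor (read noise), in decibels.
    return 20 * math.log10(full_well_e / read_noise_e)

def shot_limited_snr_db(signal_e):
    # When photon shot noise dominates, noise ~ sqrt(signal),
    # so SNR = signal / sqrt(signal) = sqrt(signal).
    return 20 * math.log10(math.sqrt(signal_e))

# Hypothetical values, e.g. a large photosite vs. a small phone-camera pixel.
print(dynamic_range_db(60000, 3))    # ~86 dB
print(dynamic_range_db(6000, 3))     # ~66 dB
print(shot_limited_snr_db(10000))    # ~40 dB at a well-exposed pixel
```

The two example full-well values hint at why larger photosites, which can store more charge before saturating, tend to deliver higher dynamic range.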


Exposure-time control

Exposure time of image sensors is generally controlled by either a conventional mechanical shutter, as in film cameras, or by an electronic shutter. Electronic shuttering can be "global", in which case the entire image sensor area's accumulation of photoelectrons starts and stops simultaneously, or "rolling", in which case the exposure interval of each row immediately precedes that row's readout, in a process that "rolls" across the image frame (typically from top to bottom in landscape format). Global electronic shuttering is less common, as it requires "storage" circuits to hold charge from the end of the exposure interval until the readout process gets there, typically a few milliseconds later.
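
The difference between the two electronic shutter modes can be made concrete with a small timing sketch (the row count, exposure time, and per-row readout time below are hypothetical, not tied to any particular sensor): in rolling mode each row's exposure window is offset by the row readout time, while in global mode all rows share one window.

```python
def exposure_windows(num_rows, exposure_ms, row_readout_ms, mode="rolling"):
    """Return (start, end) exposure times in ms for each row.

    In "rolling" mode, each row's exposure interval immediately precedes
    that row's readout, so the window slides down the frame.
    In "global" mode, every row starts and stops accumulating at once,
    and the charge is held in storage until that row is read out.
    """
    windows = []
    for row in range(num_rows):
        if mode == "rolling":
            readout_time = row * row_readout_ms
            windows.append((readout_time - exposure_ms, readout_time))
        else:  # global
            windows.append((0.0, exposure_ms))
    return windows

# A hypothetical 1080-row sensor with a 10 ms exposure and 10 us per row:
rolling = exposure_windows(1080, 10.0, 0.01, mode="rolling")
print(rolling[0], rolling[-1])   # the last row's window lags the first by ~10.8 ms
```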


Color separation

There are several main types of color image sensors, differing by the type of color-separation mechanism:

* Bayer-filter sensor, low-cost and most common, using a color filter array that passes red, green, and blue light to selected pixel sensors. Each individual sensor element is made sensitive to red, green, or blue by means of a color gel made of chemical dyes patterned over the elements. The most common filter matrix, the Bayer pattern, uses two green pixels for each red and blue. This results in less resolution for red and blue colors. The missing color samples may be interpolated using a demosaicing algorithm, or ignored altogether by lossy compression (a minimal demosaicing sketch follows this list). In order to improve color information, techniques like color co-site sampling use a piezo mechanism to shift the color sensor in pixel steps.
* Foveon X3 sensor, using an array of layered pixel sensors, separating light via the inherent wavelength-dependent absorption property of silicon, such that every location senses all three color channels. This method is similar to how color film for photography works.
* 3CCD, using three discrete image sensors, with the color separation done by a dichroic prism. The dichroic elements provide a sharper color separation, thus improving color quality. Because each sensor is equally sensitive within its passband, and at full resolution, 3-CCD sensors produce better color quality and better low-light performance. 3-CCD sensors produce a full 4:4:4 signal, which is preferred in television broadcasting, video editing and chroma key visual effects.
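
As a minimal illustration of the Bayer-filter approach, the sketch below (using NumPy and SciPy) samples a full-color image through a hypothetical RGGB mosaic and then fills in the missing samples by simple bilinear interpolation, one of the most basic demosaicing algorithms; production cameras use considerably more sophisticated methods.

```python
import numpy as np
from scipy.signal import convolve2d

def bayer_mosaic(rgb):
    """Sample a full-color image through an RGGB Bayer filter array:
    each photosite keeps only the one color its filter passes."""
    h, w, _ = rgb.shape
    mosaic = np.zeros((h, w))
    mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]   # red
    mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]   # green
    mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]   # green (two greens per red/blue)
    mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]   # blue
    return mosaic

def demosaic_bilinear(mosaic):
    """Fill in the missing color samples by averaging each channel's
    nearest measured neighbours (naive bilinear demosaicing)."""
    h, w = mosaic.shape
    masks = np.zeros((h, w, 3), dtype=bool)
    masks[0::2, 0::2, 0] = True               # red photosites
    masks[0::2, 1::2, 1] = True               # green photosites
    masks[1::2, 0::2, 1] = True
    masks[1::2, 1::2, 2] = True               # blue photosites
    kernel = np.array([[1.0, 2.0, 1.0],
                       [2.0, 4.0, 2.0],
                       [1.0, 2.0, 1.0]])
    out = np.zeros((h, w, 3))
    for c in range(3):
        known = np.where(masks[:, :, c], mosaic, 0.0)
        weight = masks[:, :, c].astype(float)
        num = convolve2d(known, kernel, mode="same")
        den = convolve2d(weight, kernel, mode="same")
        # Weighted average of measured neighbours; measured samples kept as-is.
        out[:, :, c] = np.where(masks[:, :, c], mosaic,
                                num / np.maximum(den, 1e-9))
    return out

# Round-trip a small random "scene" through the mosaic and back.
scene = np.random.rand(8, 8, 3)
approx = demosaic_bilinear(bayer_mosaic(scene))
```

Half of the mosaic's photosites are green, reflecting the Bayer pattern's two-greens-per-red-and-blue layout described above.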


Specialty sensors

Special sensors are used in various applications such as thermography, creation of multi-spectral images, video laryngoscopes, gamma cameras, sensor arrays for X-rays, and other highly sensitive arrays for astronomy. While digital cameras generally use a flat sensor, Sony prototyped a curved sensor in 2014 to reduce or eliminate the Petzval field curvature that occurs with a flat sensor. Use of a curved sensor allows for a shorter lens of smaller diameter with fewer elements and components, a greater aperture, and reduced light fall-off at the edge of the photo.


History

Early analog sensors for visible light were video camera tubes. They date back to the 1930s, and several types were developed up until the 1980s. By the early 1990s, they had been replaced by modern solid-state CCD image sensors.

The basis for modern solid-state image sensors is MOS technology, which originates from the invention of the MOSFET by Mohamed M. Atalla and Dawon Kahng at Bell Labs in 1959. Later research on MOS technology led to the development of solid-state semiconductor image sensors, including the charge-coupled device (CCD) and later the active-pixel sensor (CMOS sensor).

The passive-pixel sensor (PPS) was the precursor to the active-pixel sensor (APS). A PPS consists of passive pixels which are read out without amplification, with each pixel consisting of a photodiode and a MOSFET switch. It is a type of photodiode array, with pixels containing a p-n junction, integrated capacitor, and MOSFETs as selection transistors. A photodiode array was proposed by G. Weckler in 1968 and was the basis for the PPS. These early photodiode arrays were complex and impractical, requiring selection transistors to be fabricated within each pixel, along with on-chip multiplexer circuits. The noise of photodiode arrays was also a limitation to performance, as the photodiode readout bus capacitance resulted in an increased noise level. Correlated double sampling (CDS) could also not be used with a photodiode array without external memory.

However, in 1914, Deputy Consul General Carl R. Loop reported to the State Department in a consular report on Archibald M. Low's Televista system that "It is stated that the selenium in the transmitting screen may be replaced by any diamagnetic material".

In June 2022, Samsung Electronics announced that it had created a 200-million-pixel image sensor. The 200 MP ISOCELL HP3 has 0.56-micrometer pixels, with Samsung reporting that previous sensors had 0.64-micrometer pixels, a 12% decrease since 2019. The new sensor packs its 200 million pixels into a 1/1.4-inch optical format.


Charge-coupled device

The charge-coupled device (CCD) was invented by Willard S. Boyle and George E. Smith at Bell Labs in 1969. While researching MOS technology, they realized that an electric charge was the analogy of the magnetic bubble and that it could be stored on a tiny MOS capacitor. As it was fairly straightforward to fabricate a series of MOS capacitors in a row, they connected a suitable voltage to them so that the charge could be stepped along from one to the next. The CCD is a semiconductor circuit that was later used in the first digital video cameras for television broadcasting.

Early CCD sensors suffered from shutter lag. This was largely resolved with the invention of the pinned photodiode (PPD) by Nobukazu Teranishi, Hiromitsu Shiraki and Yasuo Ishihara at NEC in 1980. It was a photodetector structure with low lag, low noise, high quantum efficiency and low dark current. In 1987, the PPD began to be incorporated into most CCD devices, becoming a fixture in consumer electronic video cameras and then digital still cameras. Since then, the PPD has been used in nearly all CCD sensors and then CMOS sensors.


Active-pixel sensor

The NMOS active-pixel sensor (APS) was invented by Olympus in Japan during the mid-1980s. This was enabled by advances in MOS semiconductor device fabrication, with MOSFET scaling reaching micron and then sub-micron levels. The first NMOS APS was fabricated by Tsutomu Nakamura's team at Olympus in 1985. The CMOS active-pixel sensor (CMOS sensor) was later improved by a group of scientists at the NASA Jet Propulsion Laboratory in 1993. By 2007, sales of CMOS sensors had surpassed CCD sensors. By the 2010s, CMOS sensors had largely displaced CCD sensors in all new applications.


Other image sensors

The first commercial digital camera, the Cromemco Cyclops in 1975, used a 32×32 MOS image sensor. It was a modified MOS dynamic RAM (DRAM) memory chip.

MOS image sensors are widely used in optical mouse technology. The first optical mouse, invented by Richard F. Lyon at Xerox in 1980, used a 5 µm NMOS integrated circuit sensor chip. Since the first commercial optical mouse, the IntelliMouse introduced in 1999, most optical mouse devices use CMOS sensors.

In February 2018, researchers at Dartmouth College announced a new image sensing technology that the researchers call QIS, for Quanta Image Sensor. Instead of pixels, QIS chips have what the researchers call "jots". Each jot can detect a single particle of light, called a photon.
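
To give a feel for the jot concept, here is a toy sketch (an illustration under assumed parameters, not the Dartmouth group's actual design): each jot reports only whether it caught at least one photon during a short sub-exposure, and many such binary readings are summed over space and time to estimate the light level at each image pixel.

```python
import numpy as np

rng = np.random.default_rng(0)

JOTS_PER_SIDE = 4          # each image pixel is covered by a 4x4 patch of jots
JOTS_PER_PIXEL = JOTS_PER_SIDE ** 2

def jot_frame(photon_rate):
    """One binary sub-frame: each jot reports 1 if it caught >= 1 photon.

    photon_rate : 2-D array of mean photons per image pixel per sub-exposure
    """
    # Spread each pixel's expected photons evenly over its patch of jots.
    rate = np.repeat(np.repeat(photon_rate, JOTS_PER_SIDE, axis=0),
                     JOTS_PER_SIDE, axis=1) / JOTS_PER_PIXEL
    photons = rng.poisson(rate)             # photon arrivals are Poisson-distributed
    return (photons >= 1).astype(np.uint8)  # a jot only says "hit" or "no hit"

def reconstruct(photon_rate, sub_frames=64):
    """Sum many binary jot frames, then pool each pixel's jot patch."""
    acc = sum(jot_frame(photon_rate) for _ in range(sub_frames)).astype(float)
    h, w = photon_rate.shape
    pooled = acc.reshape(h, JOTS_PER_SIDE, w, JOTS_PER_SIDE).sum(axis=(1, 3))
    # Fraction of jot readings that fired: a monotonic estimate of brightness.
    return pooled / (sub_frames * JOTS_PER_PIXEL)

scene = np.linspace(0.5, 8.0, 16).reshape(4, 4)   # mean photons/pixel/sub-exposure
print(reconstruct(scene).round(2))
```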


See also

* List of sensors used in digital cameras
* Contact image sensor (CIS)
* Electro-optical sensor
* Video camera tube
* Semiconductor detector
* Fill factor (image sensor)
* Full-frame digital SLR
* Image resolution
* Image sensor format, the sizes and shapes of common image sensors
* Color filter array, mosaic of tiny color filters over color image sensors
* Sensitometry, the scientific study of light-sensitive materials
* History of television, the development of electronic imaging technology since the 1880s
* List of large sensor interchangeable-lens video cameras
* Oversampled binary image sensor
* Computer vision
* Push broom scanner
* Whisk broom scanner


References


External links


* Digital Camera Sensor Performance Summary, by Roger Clark
* (with graphical buckets and rainwater analogies)